
Kullback–Leibler divergence

See also in other dictionaries:

  • Kullback–Leibler divergence — In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P …   Wikipedia
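
In the discrete case the divergence is D_KL(P‖Q) = Σᵢ P(i) log(P(i)/Q(i)). A minimal Python sketch of that formula (the helper name kl_divergence and the distributions p and q are illustrative assumptions, not taken from any entry above):

    import math

    def kl_divergence(p, q):
        # D(P||Q) for two discrete distributions given as equal-length
        # lists of probabilities; terms with p_i == 0 contribute nothing,
        # by the usual convention 0 * log 0 = 0.
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.5, 0.3, 0.2]
    q = [0.4, 0.4, 0.2]
    print(kl_divergence(p, q))  # ~0.0253 nats; use math.log2 for bits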

  • Divergence de Kullback-Leibler — In probability theory and information theory, the Kullback–Leibler divergence[1], [2] (also K–L divergence or relative entropy) is a measure of dissimilarity between two probability distributions P and Q. It owes its… …   Wikipédia en Français

  • Divergence (disambiguation) — Divergence can refer to: in mathematics, divergence, a function that associates a scalar with every point of a vector field; divergence (computer science), a computation which does not terminate (or terminates in an exceptional state); divergence… …   Wikipedia

  • Divergence (statistics) — In statistics and information geometry, a divergence or contrast function is a function which establishes the “distance” of one probability distribution from another on a statistical manifold. The divergence is a weaker notion than that of the… …   Wikipedia
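
The Kullback–Leibler divergence above is a standard example of why a divergence is weaker than a metric: it is not symmetric. A quick check, reusing the hypothetical kl_divergence helper and the made-up p and q from the first sketch:

    # D(P||Q) and D(Q||P) generally differ, so KL is a divergence,
    # not a distance in the metric sense.
    print(kl_divergence(p, q))  # ~0.0253
    print(kl_divergence(q, p))  # ~0.0258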

  • Jensen–Shannon divergence — In probability theory and statistics, the Jensen–Shannon divergence is a popular method of measuring the similarity between two probability distributions. It is also known as information radius (IRad)… …   Wikipedia
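
A minimal sketch, assuming the standard definition JSD(P, Q) = ½ D(P‖M) + ½ D(Q‖M) with the mixture M = ½(P + Q); it reuses the hypothetical kl_divergence helper from the first sketch:

    def jensen_shannon(p, q):
        # Symmetrized, smoothed KL: both arguments are compared against
        # their pointwise average, so the result is finite and symmetric.
        m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
        return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

    print(jensen_shannon(p, q))                          # small: p and q are close
    print(jensen_shannon(p, q) == jensen_shannon(q, p))  # True: symmetric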

  • Solomon Kullback — Dr Solomon Kullback (April 3, 1907 – August 5, 1994) was a US cryptanalyst and mathematician. Kullback was one of the first three employees hired by William F. Friedman at the US Army's Signal Intelligence Service (SIS) in the… …   Wikipedia

  • Richard Leibler — (March 18, 1914 – October 25, 2003) was an American mathematician and cryptanalyst. While working at the NSA, he and Solomon Kullback formulated the Kullback–Leibler divergence, a measure of similarity between probability distributions which has… …   Wikipedia

  • Bregman divergence — In mathematics, the Bregman divergence or Bregman distance is similar to a metric, but satisfies neither the triangle inequality nor symmetry. There are two ways in which Bregman divergences are important. Firstly, they generalize squared Euclidean… …   Wikipedia
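
A sketch of the general recipe, assuming the standard definition D_F(p, q) = F(p) − F(q) − ⟨∇F(q), p − q⟩ for a strictly convex generator F; with F(x) = Σᵢ xᵢ² the gradient is 2x and the formula collapses to the squared Euclidean distance mentioned in the entry:

    def bregman_sq_euclidean(p, q):
        # Bregman divergence generated by F(x) = sum(x_i^2):
        # F(p) - F(q) - <grad F(q), p - q>, which equals sum((p_i - q_i)^2).
        F = lambda x: sum(xi * xi for xi in x)
        grad_f_q = [2 * qi for qi in q]
        inner = sum(g * (pi - qi) for g, pi, qi in zip(grad_f_q, p, q))
        return F(p) - F(q) - inner

    print(bregman_sq_euclidean([1.0, 2.0], [0.0, 0.0]))  # 5.0 == 1**2 + 2**2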

  • Information theory — Not to be confused with Information science. Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental… …   Wikipedia
